Domain Adaptation with Pre-trained Transformers for Query-Focused Abstractive Text Summarization
Authors
Abstract
The Query-Focused Text Summarization (QFTS) task aims at building systems that generate the summary of the text document(s) based on the given query. A key challenge in addressing this task is the lack of large labeled data for training the summarization model. In this article, we address this challenge by exploring a series of domain adaptation techniques. Given the recent success of pre-trained transformer models in a wide range of natural language processing tasks, we utilize such models to generate abstractive summaries for the QFTS task for both single-document and multi-document scenarios. For domain adaptation, we apply a variety of techniques using pre-trained transformer-based summarization models, including transfer learning, weakly supervised learning, and distant supervision. Extensive experiments on six datasets show that our proposed approach is very effective in generating abstractive summaries for the QFTS task while setting a new state-of-the-art result on several datasets across a set of automatic and human evaluation metrics.
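As a rough illustration of the setup described in the abstract (not the authors' released implementation), the sketch below conditions a generic pre-trained seq2seq transformer on the query by prepending it to the document before generation. The checkpoint name, separator choice, and generation settings are illustrative assumptions; in the article's setting the model would additionally be adapted to the target domain via transfer learning, weak supervision, or distant supervision before inference.

# Minimal sketch of query-focused abstractive summarization with a
# pre-trained transformer (assumptions noted in the comments).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumption: any pre-trained seq2seq summarization checkpoint can stand in here.
MODEL_NAME = "facebook/bart-large-cnn"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def query_focused_summary(query: str, document: str, max_new_tokens: int = 128) -> str:
    """Generate an abstractive summary of `document` conditioned on `query`."""
    # Prepend the query so the encoder sees it alongside the document text.
    # A fine-tuned system would be trained on query/document/summary triples;
    # here we only show the inference-time conditioning.
    text = f"{query} {tokenizer.sep_token} {document}"
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
    output_ids = model.generate(**inputs, num_beams=4, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

print(query_focused_summary(
    "What causes coral bleaching?",
    "Coral bleaching occurs when corals expel the algae living in their tissues..."))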
Similar resources
Text Generation for Abstractive Summarization
We have begun work on a framework for abstractive summarization and decided to focus on a module for text generation. For TAC 2010, we thus move away from sentence extraction. Each sentence in the summary we generate is based on a document sentence but it usually contains a smaller amount of information and uses fewer words. The system uses the output of a syntactic parser for a sentence and th...
Neural Abstractive Text Summarization
Abstractive text summarization is a complex task whose goal is to generate a concise version of a text without necessarily reusing the sentences from the original source, but still preserving the meaning and the key contents. We address this issue by modeling the problem as a sequence to sequence learning and exploiting Recurrent Neural Networks (RNNs). This work is a discussion about our ongoi...
Generative Adversarial Network for Abstractive Text Summarization
In this paper, we propose an adversarial process for abstractive text summarization, in which we simultaneously train a generative model G and a discriminative model D. In particular, we build the generator G as an agent of reinforcement learning, which takes the raw text as input and predicts the abstractive summarization. We also build a discriminator which attempts to distinguish the generat...
Query-Based Abstractive Summarization Using Neural Networks
In this paper, we present a model for generating summaries of text documents with respect to a query. This is known as query-based summarization. We adapt an existing dataset of news article summaries for the task and train a pointer-generator model using this dataset. The generated summaries are evaluated by measuring similarity to reference summaries. Our results show that a neural network sum...
Framework for Abstractive Summarization using Text-to-Text Generation
We propose a new, ambitious framework for abstractive summarization, which aims at selecting the content of a summary not from sentences, but from an abstract representation of the source documents. This abstract representation relies on the concept of Information Items (INIT), which we define as the smallest element of coherent information in a text or a sentence. Our framework differs from pr...
Journal
Journal title: Computational Linguistics
Year: 2022
ISSN: 1530-9312, 0891-2017
DOI: https://doi.org/10.1162/coli_a_00434